Incremental classifier learning based on PEDCC-loss and cosine distance
Authors
Abstract
Traditionally, the performance of deep convolutional neural networks relies on large labeled datasets prepared in advance. However, in real-world applications, training data are not collected all at once, so an algorithm that can deal with continuously incoming data is needed. This learning paradigm is called incremental learning, and its main problem is catastrophic forgetting: a neural network performs badly on old classes after being trained on new classes. To solve this problem, this paper proposes an integrated approach based on PEDCC-loss and the cosine distance between a sample's output feature and the PEDCC (predefined evenly distributed class centroids). Old knowledge learned by the network is stored separately. During the training of each network, PEDCC-loss is used to constrain samples' features to their corresponding predefined class centers. Meanwhile, samples are retained at different rates across classes, and the retention mode of samples is discussed. In the test phase, the final prediction is determined by the cosine distances between the sample's features and the class centroids of all networks. Our experiments on EMNIST, CIFAR100 and TinyImageNet show that our method can learn incrementally without quickly failing in performance. Compared with some existing algorithms, such as Hou, iCaRL and finetune, it has better performance.
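The test-phase rule described above (assign the class whose predefined centroid is nearest to the sample's feature in cosine distance) can be sketched as follows. This is a minimal illustration with hypothetical 2-D centroid values; the actual PEDCC centroids are precomputed to be evenly distributed on a unit hypersphere, which this toy example does not reproduce.

```python
import numpy as np

def cosine_distance(u, v):
    """Cosine distance = 1 - cosine similarity between two vectors."""
    return 1.0 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def predict(feature, centroids):
    """Return the index of the class whose predefined centroid is
    nearest to `feature` in cosine distance.

    centroids: (num_classes, dim) array of predefined class centroids.
    """
    distances = [cosine_distance(feature, c) for c in centroids]
    return int(np.argmin(distances))

# Toy example: three hypothetical centroids on the unit circle.
centroids = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
feature = np.array([0.9, 0.1])
print(predict(feature, centroids))  # nearest centroid is class 0
```

With several incrementally trained networks, the same rule is applied over the centroids of all networks and the globally nearest centroid decides the final prediction.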
Similar papers
A Tweets Classifier based on Cosine Similarity
The 2017 Microblog Cultural Contextualization task consists of three challenges: (1) Content Analysis, (2) Microblog search, and (3) TimeLine illustration. This paper describes the use of cosine similarity, which is characterized by the comparison of similarity between two vectors of an inner product space. This research used two approaches: (1) word2vec and (2) Bag-of-Words (BoW) for extractin...
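Cosine similarity over Bag-of-Words vectors, as used in the snippet above, can be sketched in a few lines. The tokenization and vocabulary here are illustrative placeholders, not the paper's actual pipeline.

```python
import math
from collections import Counter

def bow_vector(tokens, vocab):
    """Count-based Bag-of-Words vector over a fixed vocabulary."""
    counts = Counter(tokens)
    return [counts[w] for w in vocab]

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors in an inner product space."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

t1 = "deep networks learn features".split()
t2 = "networks learn deep representations".split()
vocab = sorted(set(t1) | set(t2))
print(cosine_similarity(bow_vector(t1, vocab), bow_vector(t2, vocab)))  # 0.75
```

Three of the four tokens overlap, so the two unit-normalized vectors agree on 3 of 4 nonzero coordinates, giving a similarity of 0.75.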
Incremental Learning in a Fril-based Odour Classifier
We describe the application of the Fril data browser to the classification of odours using signals from conducting polymer sensors. The theoretical background to the Fril data browser is explained and its application to a small data set of experimental sensor values is outlined. Further work is underway applying the method to a larger set of experimental data. Extensions to the data browser fra...
Incremental Classifier Learning with Generative Adversarial Networks
In this paper, we address the incremental classifier learning problem, which suffers from catastrophic forgetting. The main reason for catastrophic forgetting is that the past data are not available during learning. Typical approaches keep some exemplars for the past classes and use distillation regularization to retain the classification capability on the past classes and balance the past and ...
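The distillation regularization mentioned above softens the old (teacher) network's logits with a temperature and penalizes the new (student) network for diverging from them. A minimal NumPy sketch of that idea, under the standard formulation rather than any one cited paper's exact loss:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T gives a softer distribution."""
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy between the softened teacher distribution (a frozen
    copy of the old network) and the softened student distribution."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return -float(np.sum(p_teacher * np.log(p_student + 1e-12)))
```

The loss is minimized when the student reproduces the teacher's (softened) outputs, which is what preserves classification capability on the past classes while new classes are learned.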
A label distance maximum-based classifier for multi-label learning
Multi-label classification is useful in many bioinformatics tasks such as gene function prediction and protein site localization. This paper presents an improved neural network algorithm, Max Label Distance Back Propagation Algorithm for Multi-Label Classification. The method was formulated by modifying the total error function of the standard BP by adding a penalty term, which was realized by ...
Incremental classifier based on a local credibility criterion
In this paper we propose the Local Credibility Concept (LCC), a novel technique for incremental classifiers. It measures the classification rate of the classifier’s local models and ensures that the models do not cross the borders between classes, but allows them to develop freely within the domain of their own class. Thus, we reduce the dependency on the order of training samples, an inherent ...
Journal
Journal title: Multimedia Tools and Applications
Year: 2021
ISSN: 1380-7501, 1573-7721
DOI: https://doi.org/10.1007/s11042-021-11163-w